Relaxed Extragradient Algorithms for the Split Feasibility Problem

Authors
Abstract


Similar resources

Relaxed Implicit Extragradient-like Methods for Finding Minimum-norm Solutions of the Split Feasibility Problem

In this paper, we consider the split feasibility problem (SFP) in infinite-dimensional Hilbert spaces, and study the relaxed implicit extragradient-like methods for finding a common element of the solution set Γ of the SFP and the set Fix(S) of fixed points of a nonexpansive mapping S. Combining Mann’s implicit iterative method and Korpelevich’s extragradient method, we propose two implicit ite...
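The abstract is truncated, but the extragradient mechanism it builds on can be sketched: apply Korpelevich's prediction/correction step to the SFP objective f(x) = (1/2)‖(I − P_Q)Ax‖², whose gradient is A^T(I − P_Q)Ax, and then average with the nonexpansive mapping S in Mann fashion. The Python sketch below is only an illustration under these assumptions; the function name, the explicit (rather than implicit) averaging, and the step sizes tau and alpha are not taken from the paper.

```python
import numpy as np

def extragradient_mann_step(x, A, proj_C, proj_Q, S, tau=0.1, alpha=0.5):
    """One illustrative step: Korpelevich prediction/correction for the SFP
    objective f(x) = 1/2*||(I - P_Q)Ax||^2, followed by a Mann-type averaging
    with a nonexpansive mapping S (not the paper's two implicit schemes)."""
    grad = lambda z: A.T @ (A @ z - proj_Q(A @ z))   # gradient of f at z
    y = proj_C(x - tau * grad(x))                    # prediction (extragradient) step
    z = proj_C(x - tau * grad(y))                    # correction step
    return (1 - alpha) * z + alpha * S(z)            # averaging with S

# toy usage: C = [-1, 1]^2, Q = [0, 1]^2, S = identity (so Fix(S) is everything)
A = np.array([[1.0, 2.0], [0.0, 1.0]])
x = np.array([5.0, -3.0])
for _ in range(200):
    x = extragradient_mann_step(x, A,
                                proj_C=lambda v: np.clip(v, -1.0, 1.0),
                                proj_Q=lambda v: np.clip(v, 0.0, 1.0),
                                S=lambda v: v, tau=0.1)
```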


A Splitting-relaxed Projection Method for Solving the Split Feasibility Problem

The split feasibility problem (SFP) is to find x ∈ C so that Ax ∈ Q, where C is a nonempty closed convex subset of R^n, Q is a nonempty closed convex subset of R^m, and A is a matrix from R^n into R^m. One of the successful methods for solving the SFP is Byrne's CQ algorithm. However, to carry out the CQ algorithm, it is required that the closed convex subsets are simple and that the matrix norm is kno...
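For reference, the CQ iteration referred to above is x_{k+1} = P_C(x_k − γ A^T(I − P_Q)Ax_k) with γ ∈ (0, 2/‖A‖²). Below is a minimal NumPy sketch, assuming the projections proj_C and proj_Q onto the "simple" sets C and Q are supplied by the caller; the toy sets at the end are illustrative only.

```python
import numpy as np

def cq_algorithm(A, proj_C, proj_Q, x0, gamma=None, iters=500):
    """Byrne's CQ iteration x_{k+1} = P_C(x_k - gamma * A^T (I - P_Q) A x_k),
    with gamma in (0, 2/||A||^2).  Note that the spectral norm of A is computed
    here, which is exactly the requirement the abstract calls restrictive."""
    if gamma is None:
        gamma = 1.0 / np.linalg.norm(A, 2) ** 2      # safe choice inside (0, 2/||A||^2)
    x = x0.astype(float)
    for _ in range(iters):
        Ax = A @ x
        x = proj_C(x - gamma * (A.T @ (Ax - proj_Q(Ax))))
    return x

# toy example: C = nonnegative orthant of R^3, Q = [0, 1]^2 (both "simple" sets)
A = np.array([[1.0, 0.5, 0.0], [0.0, 1.0, 1.0]])
x = cq_algorithm(A,
                 proj_C=lambda v: np.maximum(v, 0.0),
                 proj_Q=lambda v: np.clip(v, 0.0, 1.0),
                 x0=np.array([3.0, -2.0, 4.0]))
```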


Self-Adaptive and Relaxed Self-Adaptive Projection Methods for Solving the Multiple-Set Split Feasibility Problem

The constraint sets are relaxed to half-spaces, so that the algorithm is implementable. We need not estimate the Lipschitz constant or make a sufficient decrease of the objective function at each iteration; besides, these projection algorithms can reduce to modifications of the CQ algorithm [6] when the MSSFP (1.1) is reduced to the SFP. We also show convergence of the algorithms under mild conditions.
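A hedged sketch of the half-space relaxation described above: C = {x : c(x) ≤ 0} and Q = {y : q(y) ≤ 0} are replaced by the half-spaces obtained from linearizing c and q at the current iterate (these half-spaces contain C and Q and admit closed-form projections), and the step size is chosen self-adaptively so that neither ‖A‖ nor a Lipschitz constant is needed. The function names, the particular step-size rule τ = ρ f(x)/‖∇f(x)‖² with ρ ∈ (0, 4), and the differentiability of c and q are assumptions for illustration; this is not the paper's exact scheme.

```python
import numpy as np

def proj_halfspace(x, a, b):
    """Closed-form projection onto the half-space {y : <a, y> <= b}."""
    viol = a @ x - b
    return x if viol <= 0 or a @ a == 0 else x - (viol / (a @ a)) * a

def relaxed_selfadaptive_step(x, A, c, grad_c, q, grad_q, rho=1.0):
    """One relaxed projection step for the SFP with C = {c <= 0}, Q = {q <= 0}:
    both sets are replaced by half-spaces from their linearizations at the
    current point, and the step size is chosen self-adaptively, so neither
    ||A|| nor a Lipschitz constant is needed (illustrative sketch only)."""
    Ax = A @ x
    # half-space Q_k = {y : q(Ax) + <grad_q(Ax), y - Ax> <= 0}, which contains Q
    aQ = grad_q(Ax)
    bQ = aQ @ Ax - q(Ax)
    residual = Ax - proj_halfspace(Ax, aQ, bQ)       # (I - P_{Q_k}) A x
    grad = A.T @ residual                            # gradient of 1/2*||residual||^2
    # self-adaptive step size tau = rho * f(x) / ||grad f(x)||^2, rho in (0, 4)
    tau = rho * 0.5 * (residual @ residual) / max(grad @ grad, 1e-15)
    # half-space C_k from the linearization of c at x, which contains C
    aC = grad_c(x)
    bC = aC @ x - c(x)
    return proj_halfspace(x - tau * grad, aC, bC)
```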


Regularized Methods for the Split Feasibility Problem

However, problem (1.8) is, in general, ill-posed, so regularization is needed. We consider Tikhonov's regularization min_{x∈C} f_α(x) := (1/2)‖(I − P_Q)Ax‖² + (α/2)‖x‖² (1.9), where α > 0 is the regularization parameter. The gradient of f_α is ∇f_α(x) = ∇f(x) + αx = (A*(I − P_Q)A + αI)x (1.10). Define the Picard iterates x_{n+1} = P_C((I − γ(A*(I − P_Q)A + αI))x_n) (1.11). Xu [20] showed that if the SFP...
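A minimal sketch of the Picard iteration (1.11) reconstructed above, assuming projections onto C and Q are available and taking γ = 1/(‖A‖² + α), which lies inside the admissible range (0, 2/(‖A‖² + α)); the default values of α and the iteration count are illustrative, not those analyzed by Xu [20].

```python
import numpy as np

def regularized_sfp(A, proj_C, proj_Q, x0, alpha=1e-3, gamma=None, iters=1000):
    """Picard iteration (1.11) for the Tikhonov-regularized objective
    f_alpha(x) = 1/2*||(I - P_Q)Ax||^2 + (alpha/2)*||x||^2, i.e.
    x_{n+1} = P_C( x_n - gamma*( A^T (I - P_Q) A x_n + alpha*x_n ) )."""
    if gamma is None:
        gamma = 1.0 / (np.linalg.norm(A, 2) ** 2 + alpha)  # in (0, 2/(||A||^2 + alpha))
    x = x0.astype(float)
    for _ in range(iters):
        Ax = A @ x
        x = proj_C(x - gamma * (A.T @ (Ax - proj_Q(Ax)) + alpha * x))
    return x
```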


The Algorithm for Solving the Inverse Numerical Range Problem

The numerical range of a square matrix A is denoted by W(A) and is defined as W(A) = {x*Ax : x ∈ S₁}, where S₁ is the unit sphere. In 2009, Russell Carden posed the inverse numerical range problem as follows: for a point z ∈ W(A), find a vector x ∈ S₁ such that z = x*Ax. In this thesis, we present an algorithm for solving the inverse numerical range problem.
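The thesis's algorithm itself is not given in this abstract, so the sketch below only illustrates the problem statement: it searches for a unit vector x with x*Ax ≈ z by minimizing the residual |x*Ax − z|² over real parametrizations of x, with random restarts. The optimizer choice, the restart count, and the toy example are arbitrary assumptions, and the search can stall in local minima; it is not Carden's method or the thesis's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def rayleigh(A, x):
    """Numerical-range point x* A x for (the normalization of) x."""
    x = x / np.linalg.norm(x)
    return np.vdot(x, A @ x)

def inverse_numerical_range(A, z, trials=20, seed=0):
    """Brute-force illustration: minimize |x* A x - z|^2 over unit vectors x,
    packing Re(x) and Im(x) into one real parameter vector (not the thesis's
    algorithm, only a check of the problem statement)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)

    def residual(p):
        x = p[:n] + 1j * p[n:]
        if np.linalg.norm(x) < 1e-12:
            return 1.0                       # penalize the degenerate zero vector
        return abs(rayleigh(A, x) - z) ** 2

    best = None
    for _ in range(trials):                  # random restarts to dodge local minima
        res = minimize(residual, rng.standard_normal(2 * n), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    x = best.x[:n] + 1j * best.x[n:]
    return x / np.linalg.norm(x)

# toy usage: the midpoint of two diagonal entries lies in W(A) by convexity
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
z = 0.5 * (A[0, 0] + A[1, 1])
x = inverse_numerical_range(A, z)
print(abs(rayleigh(A, x) - z))               # residual; small if the search succeeded
```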



Journal

Journal title: Journal of Applied Mathematics

Year: 2014

ISSN: 1110-757X,1687-0042

DOI: 10.1155/2014/468079